Precise coordinated planning over a forward time window enables safe and highly efficient motion when many robots must work together in tight spaces, but this normally requires centralised control of all devices, which is difficult to scale. We demonstrate GBP Planning, a new, purely distributed technique for multi-robot planning problems based on Gaussian Belief Propagation, formulated over a generic factor graph defining dynamics and collision constraints. In simulation, we show that our method allows extremely high-performance collaborative planning, with robots able to cross paths in busy, intricate scenarios. Even under communication failure, they maintain shorter, quicker, and smoother trajectories than alternative distributed planning techniques.
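To make the mechanism concrete, below is a minimal sketch of Gaussian Belief Propagation on a toy factor graph. Everything here is an illustrative assumption, not the paper's implementation: positions are 1-D, all factors are linear, and a linear "separation" factor stands in for the paper's nonlinear collision factor.

```python
# Minimal sketch of Gaussian Belief Propagation (GBP) on a toy factor
# graph, in the spirit of GBP Planning. Assumptions (not from the paper):
# 1-D robot positions, linear factors, information (canonical) form.
import numpy as np

class Variable:
    """A 1-D variable node; stores incoming (eta, lam) messages per factor."""
    def __init__(self):
        self.in_msgs = {}                       # factor -> (eta, lam)

    def msg_to(self, factor):
        # Product of all incoming messages except the one from `factor`.
        eta = sum(e for f, (e, l) in self.in_msgs.items() if f is not factor)
        lam = sum(l for f, (e, l) in self.in_msgs.items() if f is not factor)
        return eta, lam

    def belief(self):
        eta = sum(e for e, l in self.in_msgs.values())
        lam = sum(l for e, l in self.in_msgs.values())
        return eta / lam, 1.0 / lam             # posterior mean, variance

class LinearFactor:
    """Gaussian factor  J @ x ~ N(z, sigma^2)  in information form."""
    def __init__(self, variables, J, z, sigma):
        self.vars = variables
        J = np.asarray(J, float)
        prec = 1.0 / sigma**2
        self.lam0 = prec * np.outer(J, J)       # precision block
        self.eta0 = prec * J * z                # information vector

    def send(self):
        for i, vi in enumerate(self.vars):
            eta, lam = self.eta0.copy(), self.lam0.copy()
            for j, vj in enumerate(self.vars):
                if j != i:                      # absorb other variables' msgs
                    e, l = vj.msg_to(self)
                    eta[j] += e
                    lam[j, j] += l
            out = [j for j in range(len(self.vars)) if j != i]
            if out:                             # marginalize via Schur complement
                S = lam[np.ix_([i], out)] @ np.linalg.inv(lam[np.ix_(out, out)])
                msg = (eta[i] - (S @ eta[out]).item(),
                       lam[i, i] - (S @ lam[np.ix_(out, [i])]).item())
            else:
                msg = (eta[i], lam[i, i])
            vi.in_msgs[self] = msg

# Two "robots" in 1-D: priors pull them towards 0 and 1, while a
# separation factor (the collision-constraint stand-in) asks x1 - x0 = 2.
x0, x1 = Variable(), Variable()
factors = [LinearFactor([x0], [1.0], 0.0, 1.0),
           LinearFactor([x1], [1.0], 1.0, 1.0),
           LinearFactor([x0, x1], [-1.0, 1.0], 2.0, 0.5)]
for _ in range(20):                             # synchronous GBP sweeps
    for f in factors:
        f.send()
print(x0.belief(), x1.belief())   # means ~ -0.44 and 1.44: a compromise
```

Note that each factor update only needs messages exchanged between a variable and its immediate neighbours in the graph, which is what makes a GBP-style planner amenable to purely distributed, per-robot execution.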
Multilingual language models (LMs) such as mBERT, XLM-R, mT5, and mBART have been remarkably successful in enabling natural language tasks in low-resource languages through cross-lingual transfer from high-resource ones. In this work, we try to better understand how such models, specifically mT5, transfer *any* linguistic and semantic knowledge across languages, even though no explicit cross-lingual signals are provided during pre-training. Rather, only unannotated texts from each language are presented to the model separately and independently of one another, and the model appears to implicitly learn cross-lingual connections. This raises several questions that motivate our study, such as: Are the cross-lingual connections between every language pair equally strong? What properties of the source and target languages affect the strength of cross-lingual transfer? Can we quantify the impact of those properties on cross-lingual transfer? In our investigation, we analyze a pre-trained mT5 to discover the attributes of the cross-lingual connections learned by the model. Through a statistical interpretation framework over 90 language pairs across three tasks, we show that transfer performance can be modeled by a few linguistic and data-derived features. These observations enable us to interpret the cross-lingual understanding of the mT5 model and, in practice, to choose the best source language for a task and to anticipate its training-data demands. A key finding of this work is that similarities of syntax, morphology, and phonology are good predictors of cross-lingual transfer, significantly more so than the mere lexical similarity of languages. For a given language, we are also able to predict zero-shot performance, which increases on a logarithmic scale with the number of few-shot target-language data points.
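The following sketch illustrates, on entirely synthetic data, the kind of regression such a statistical interpretation framework could use: transfer performance modeled from a few linguistic-similarity features plus the log of the few-shot target-data size. The feature names, coefficients, and data below are all illustrative assumptions, not the paper's dataset or results.

```python
# Illustrative regression on synthetic data: model cross-lingual transfer
# scores from linguistic-similarity features and log(data size). All
# numbers here are made up for demonstration.
import numpy as np

rng = np.random.default_rng(0)
n = 90                                    # language pairs, as in the paper

syntax_sim = rng.uniform(size=n)          # syntactic similarity in [0, 1]
morph_sim  = rng.uniform(size=n)          # morphological similarity
phono_sim  = rng.uniform(size=n)          # phonological similarity
lex_sim    = rng.uniform(size=n)          # lexical similarity
n_target   = rng.integers(10, 10_000, size=n)   # few-shot target examples

# Synthetic scores shaped like the paper's findings: syntax, morphology,
# and phonology dominate lexical overlap, and performance grows
# logarithmically with target-language data size.
score = (0.30 * syntax_sim + 0.25 * morph_sim + 0.20 * phono_sim
         + 0.05 * lex_sim + 0.08 * np.log(n_target)
         + rng.normal(scale=0.05, size=n))

# Ordinary least squares with an intercept recovers the feature weights.
X = np.column_stack([np.ones(n), syntax_sim, morph_sim, phono_sim,
                     lex_sim, np.log(n_target)])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
for name, c in zip(["intercept", "syntax", "morphology", "phonology",
                    "lexical", "log(data)"], coef):
    print(f"{name:>10}: {c:+.3f}")
```

Under a fitted model of this form, the log(data) coefficient is what lets one anticipate how many few-shot target-language examples a desired performance level would demand.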